High Dimensional Bayesian Optimization with Elastic Gaussian Process
Authors
Abstract
Bayesian optimization depends on solving a global optimization of an acquisition function. In high dimensions, however, the acquisition function can be extremely sharp, with only a few peaks marooned in a vast terrain of almost flat surface. Global optimization algorithms such as DIRECT are infeasible in high dimensions, and gradient-based methods cannot move if initialized in the flat terrain. We propose an algorithm that enables local gradient-based methods to traverse the flat terrain by using a sequence of coarse-to-fine Gaussian process priors on the objective function. Experiments on synthetic and real-world case studies clearly demonstrate the utility of the proposed method in high dimensions.
Similar works
Active Learning of Linear Embeddings for Gaussian Processes
We propose an active learning method for discovering low-dimensional structure in high-dimensional Gaussian process (GP) tasks. Such problems are increasingly frequent and important, but have hitherto presented severe practical difficulties. We further introduce a novel technique for approximately marginalizing GP hyperparameters, yielding marginal predictions robust to hyperparameter misspecifi...
Batched Large-scale Bayesian Optimization in High-dimensional Spaces
Bayesian optimization (BO) has become an effective approach for black-box function optimization problems when function evaluations are expensive and the optimum can be achieved within a relatively small number of queries. However, many problems, such as those with high-dimensional inputs, may require a much larger number of observations for optimization. Despite an abundance of observations tha...
High-Dimensional Gaussian Process Bandits
Many applications in machine learning require optimizing unknown functions defined over a high-dimensional space from noisy samples that are expensive to obtain. We address this notoriously hard challenge, under the assumptions that the function varies only along some low-dimensional subspace and is smooth (i.e., it has a low norm in a Reproducing Kernel Hilbert Space). In particular, we prese...
Decentralized High-Dimensional Bayesian Optimization with Factor Graphs
This paper presents a novel decentralized high-dimensional Bayesian optimization (DEC-HBO) algorithm that, in contrast to existing HBO algorithms, can exploit the interdependent effects of various input components on the output of the unknown objective function f for boosting the BO performance and still preserve scalability in the number of input dimensions without requiring prior knowledge or...
Capability of the Stochastic Seismic Inversion in Detecting the Thin Beds: a Case Study at One of the Persian Gulf Oilfields
The aim of seismic inversion is mapping all of the subsurface structures from seismic data. Due to the band-limited nature of the seismic data, it is difficult to find a unique solution for seismic inversion. Deterministic methods of seismic inversion are based on trial-and-error techniques and provide a smooth map of elastic properties, while stochastic methods produce high-resolution maps of el...